In this paper, we consider asymptotic properties of the support vector machine (SVM) in high-dimension, low-sample-size (HDLSS) settings. We show that the hard-margin linear SVM enjoys a consistency property, in which misclassification rates tend to zero as the dimension goes to infinity, under certain severe conditions. We show that the SVM is heavily biased in HDLSS settings and that its performance is directly affected by this bias. In order to overcome such difficulties, we propose a bias-corrected SVM (BC-SVM). We show that the BC-SVM gives preferable performance in HDLSS settings. We also discuss SVMs in multiclass HDLSS settings. Finally, we examine the performance of the classifiers in actual data analyses.